Conversation
swiatkowski
left a comment
Please address my comments, but looks good in general!
```python
else:
    raise ValueError("Couldn't find your loss_function: " + self._config['loss_function']['name'])
```

```python
loss_function.cuda()
```

```python
for epoch in range(50):  # loop over the dataset multiple times
    print "Epoch {}/50".format(epoch)
```

```python
image_a_depth_numpy = np.asarray(image_a_depth)
image_b_depth_numpy = np.asarray(image_b_depth)

image_a_mask_numpy = np.asarray(image_a_mask)
```
Shouldn't this be behind some flag? Like loss == contrastive-loss, etc.?
This is a workaround for this issue: RobotLocomotion/pytorch-dense-correspondence#204
Short story: a few mask files are empty. If so, skip this training sample.
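The workaround described above could be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual code: the helper name `mask_is_empty` is invented here, and only the `np.asarray` mask loading mirrors the quoted diff.

```python
import numpy as np

def mask_is_empty(mask):
    """Return True when a mask contains no foreground (nonzero) pixels.

    Hypothetical helper illustrating the workaround for empty mask files
    (RobotLocomotion/pytorch-dense-correspondence#204).
    """
    return np.count_nonzero(np.asarray(mask)) == 0
```

The training loop would then skip the sample, e.g. `if mask_is_empty(image_a_mask_numpy): continue`.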
@gasiortomasz What I meant on Slack is that we should add the comment you wrote here, but in the code, above this section.
```python
@@ -0,0 +1,139 @@
import numpy as np
```
Let's add a comment at the top of this file saying where this code comes from. This will give more context to readers and might help with debugging. Also, please clarify what is taken directly from the R2D2 code and what you needed to change/add.
```python
import torch.nn as nn
import torch.nn.functional as F

# this class is taken from https://github.com/naver/r2d2/blob/master/nets/ap_loss.py
```
@gasiortomasz Usually you would include such comments at the end of the docstring of the class. Here, after the line starting with "Note: typically...".
This applies to all the comments you added here.
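The suggested placement might look like this sketch. The class name and the docstring body are hypothetical placeholders; only the provenance note mirrors the comment quoted in the diff.

```python
class APLoss:
    """Differentiable Average Precision loss (placeholder docstring).

    Note: typically ... (the existing docstring text would go here)

    This class is taken from
    https://github.com/naver/r2d2/blob/master/nets/ap_loss.py
    """
```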
Add AP loss classes and Sampler class for sampling non-correspondences.
Add support for loss choice in training class
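The "loss choice" support mentioned above could be wired up roughly like the quoted `raise ValueError` branch suggests. A minimal self-contained sketch, assuming config-driven dispatch; the loss names and placeholder classes here are hypothetical, not the PR's actual implementations:

```python
class ContrastiveLoss:
    pass  # placeholder for the existing contrastive loss

class APLoss:
    pass  # placeholder for the R2D2-derived AP loss

def build_loss_function(config):
    # Look up the loss by name, mirroring the quoted error-handling branch.
    name = config['loss_function']['name']
    if name == 'contrastive':
        return ContrastiveLoss()
    elif name == 'ap_loss':
        return APLoss()
    else:
        raise ValueError("Couldn't find your loss_function: " + name)
```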